Recovery Conditions and Sampling Strategies for Network Lasso
Authors
Abstract
The network Lasso is a recently proposed convex optimization method for machine learning from massive network-structured datasets, i.e., big data over networks. It is a variant of the well-known least absolute shrinkage and selection operator (Lasso), which underlies many methods in learning and signal processing involving sparse models. Highly scalable implementations of the network Lasso can be obtained by state-of-the-art proximal methods, e.g., the alternating direction method of multipliers (ADMM). By generalizing the compatibility condition put forward by van de Geer and Bühlmann as a powerful tool for the analysis of the plain Lasso, we derive a sufficient condition on the underlying network topology, the network compatibility condition (NCC), under which the network Lasso accurately learns a clustered underlying graph signal. The NCC relates the location of the sampled nodes to the clustering structure of the network. In particular, it informs the choice of which nodes to sample, or in machine learning terms, which data points provide the most information if labeled.
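Concretely, the network Lasso combines a least-squares fit on the sampled (labeled) nodes with a total-variation penalty over the graph edges. The snippet below is a minimal sketch of that objective on a made-up toy chain graph, solved with the generic cvxpy solver rather than the ADMM implementation referred to in the abstract; the graph, the sampled nodes, and the value of lam are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the network Lasso objective (assumptions: toy chain graph,
# arbitrary sampling set and regularization strength; not the authors' code).
import numpy as np
import cvxpy as cp

# Toy chain graph with two clusters: nodes 0-2 and 3-5.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
weights = np.ones(len(edges))
n = 6

# Noisy labels observed only at a few sampled nodes (the sampling set).
samples = {0: 1.1, 1: 0.9, 4: -1.0}

lam = 1.0                      # regularization strength (illustrative)
x = cp.Variable(n)             # graph signal to be learned

# Empirical error on the sampled nodes.
fit = sum((x[i] - y) ** 2 for i, y in samples.items())

# Total variation of the graph signal: weighted sum of edge differences.
tv = sum(w * cp.abs(x[i] - x[j]) for (i, j), w in zip(edges, weights))

cp.Problem(cp.Minimize(fit + lam * tv)).solve()
print(np.round(x.value, 2))    # expected: a piecewise-constant (clustered) estimate
```

Because the total-variation term penalizes differences across edges, the recovered signal tends to be constant within well-connected clusters, which is the clustered structure the NCC analysis is concerned with.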
Similar References
When Is Network Lasso Accurate?
The “least absolute shrinkage and selection operator” (Lasso) method has been adapted recently for network-structured datasets. In particular, this network Lasso method allows graph signals to be learned from a small number of noisy signal samples by using the total variation of a graph signal for regularization. While efficient and scalable implementations of the network Lasso are available, only l...
The Log-linear Group-Lasso Estimator and Its Asymptotic Properties
We define the group-lasso estimator for the natural parameters of the exponential families of distributions representing hierarchical log-linear models under a multinomial sampling scheme. Such an estimator arises as the solution of a convex penalized likelihood optimization problem based on the group-lasso penalty. We illustrate how it is possible to construct an estimator of the underlying log-lin...
Distributed and Cooperative Compressive Sensing Recovery Algorithm for Wireless Sensor Networks with Bi-directional Incremental Topology
Recently, the problem of compressive sensing (CS) has attracted a great deal of attention in the area of signal processing, and much of the research in this field is devoted to this issue. One of the applications where CS can be used is wireless sensor networks (WSNs). A WSN consists of many low-power wireless sensors. This requires that any improved algorithm for this appli...
Block Regularized Lasso for Multivariate Multi-Response Linear Regression
The multivariate multi-response (MVMR) linear regression problem is investigated, in which the design matrices are Gaussian with covariance matrices Σ = (Σ^(1), ..., Σ^(K)) for the K linear regressions. The support union of the K p-dimensional regression vectors (collected as columns of the matrix B∗) is recovered using l1/l2-regularized Lasso. Sufficient and necessary conditions to guarantee successful recovery ...
Sharp Threshold for Multivariate Multi-Response Linear Regression via Block Regularized Lasso
for K linear regressions. The support union of K p-dimensional regression vectors (collected as columns of matrix B∗) is recovered using l1/l2-regularized Lasso. Sufficient and necessary conditions on sample complexity are characterized as a sharp threshold to guarantee successful recovery of the support union. This model has been previously studied via l1/l∞-regularized Lasso by Negahban & Wain...
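The l1/l2 (row-group) penalty used in the two block-regularized Lasso entries above can be written down concretely. The sketch below, on synthetic data, is only an illustration of that penalty via cvxpy; the dimensions, variable names, and regularization value are all assumptions, not taken from either paper.

```python
# Minimal sketch of l1/l2 (row-group) regularized regression for K responses
# sharing a support union (assumptions: toy sizes, synthetic Gaussian data).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
K, n, p = 3, 50, 20                                 # responses, samples, features
X = [rng.standard_normal((n, p)) for _ in range(K)]
B_true = np.zeros((p, K))
B_true[:4] = rng.standard_normal((4, K))            # shared support across responses
Y = [X[k] @ B_true[:, k] + 0.1 * rng.standard_normal(n) for k in range(K)]

lam = 1.0
B = cp.Variable((p, K))
fit = sum(cp.sum_squares(Y[k] - X[k] @ B[:, k]) for k in range(K))
penalty = cp.sum(cp.norm(B, 2, axis=1))             # l1/l2: sum of row-wise l2 norms
cp.Problem(cp.Minimize(fit + lam * penalty)).solve()

# Rows with non-negligible norm form the estimated support union.
print(np.nonzero(np.linalg.norm(B.value, axis=1) > 1e-3)[0])
```

The row-wise l2 norm couples the K regressions, so a feature is either selected for all responses or for none, which is what makes recovery of the common support union possible.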
Journal: CoRR
Volume: abs/1709.01402
Pages: -
Publication year: 2017